Uncertainty and Investment in Electricity Generation: the Case of Hydro-Québec
Worldwide, the electricity industry is undergoing a substantial process of restructuring, with an emphasis on the introduction of competition in the generation sector. Competition is ostensibly going to lead to better incentives, both in the use of existing resources and in future investment decisions. One of the main drivers of this new environment will be the increased opportunity for energy sales between what had been, before the introduction of competition, fairly closed markets. These new opportunities may lead to new investments in generation and transmission capacity to take advantage of cost differentials between regions, one of the driving factors in the call for restructuring. Accounting for some of the underlying complexity of electricity systems, specifically equipment availability and load duration curves, this paper illustrates how uncertainty affects investment in generation. We offer a simple two-region model to analyse this problem, based on the linear programming model of Chaton (1997). Specifically, we analyse the case where one region has access to four generation technologies, differentiated by cost characteristics as well as construction lead times. A second (neighbouring) region has access to only one of the generation technologies, hence the necessary asymmetry between producing regions. Uncertainty is present in the demand for energy in the first market, as well as in the input fuel prices. Given this uncertainty, and the possibility of electricity sales between regions, we investigate and characterise optimal generation investment in the first market as a function of the problem parameters. The model is calibrated with data from Hydro-Québec and the northeastern United States. This application is particularly interesting and relevant, given the abundance of relatively cheap hydroelectric power in Québec, and Hydro-Québec’s self-proclaimed strategic interest in increasing its exports to the northeastern markets.
The numerical example illustrates the importance of appropriately modelling the complexity of the electrical system when considering the impacts of restructuring.
Keywords: Electricity Restructuring, Investment under Uncertainty
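As a toy illustration of the two-region asymmetry described above, the sketch below dispatches a merit-order stack in one region, with exports to a neighbouring single-technology region capped by a transmission limit. All technology costs and capacities are hypothetical; the paper's actual model is a stochastic linear programme with lead times and load duration curves, which this greedy dispatch does not attempt to reproduce.

```python
# Toy merit-order dispatch for two asymmetric regions joined by a
# transmission line. All costs and capacities are invented for illustration.

def dispatch(demand_a, demand_b, tx_limit):
    """Total cost of meeting both regional demands, cheapest plant first."""
    tech_a = [(5, 300), (20, 200), (35, 150), (60, 100)]  # (cost $/MWh, MW)
    tech_b = [(40, 400)]                                  # single technology
    cost, remaining, need = 0.0, [], demand_a
    for c, cap in tech_a:                # serve region A from its own stack
        use = min(need, cap)
        cost += use * c
        need -= use
        remaining.append((c, cap - use))
    # Region B chooses between imports (capped by the line) and its own plant.
    offers = sorted([(c, cap, "A") for c, cap in remaining] +
                    [(c, cap, "B") for c, cap in tech_b])
    export_room, need = tx_limit, demand_b
    for c, cap, region in offers:
        use = min(need, cap)
        if region == "A":                # exports limited by transmission
            use = min(use, export_room)
            export_room -= use
        cost += use * c
        need -= use
    return cost
```

With a 200 MW line, cheap region-A capacity displaces part of region B's own generation; with the line at zero, region B must self-supply at a higher total cost, which is the cost-differential effect the abstract points to.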
Dibenzoylhydrazines as Insect Growth Modulators: Topology-Based QSAR Modelling
Dibenzoylhydrazines Xa-(C6H5)a-CO-N-(t-Bu)-NH-CO-(C6H5)b-Yb are efficient insect growth regulators with high activity and selectivity toward lepidopteran and coleopteran pests. For 123 congeneric molecules, a quantitative structure-activity relationship model was built in the framework of the QSARINS package using 2D, topology-based PaDEL descriptors. Variable selection by GA-MLR allows building an efficient multilinear regression linking pEC50 values to nine structural variables. Robustness and quality of the model were carefully examined at various levels: data fitting (recall), leave-one-out and leave-some-out cross-validation, and internal and external validation (including random splitting), points not investigated in depth in previous works. Various machine learning approaches (Partial Least Squares Regression, Projection Pursuit Regression, Linear Support Vector Machine, and a Three-Layer Perceptron Artificial Neural Network) confirm the validity of the analysis, giving highly consistent results of comparable quality, with only a slight advantage for the three-layer perceptron.
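The multilinear regression and leave-one-out statistics mentioned above can be sketched with ordinary least squares on synthetic data. The descriptors, response values, and seed below are invented, and the GA-based variable selection of QSARINS is not reproduced; the point is only the r2/q2 machinery.

```python
import numpy as np

# Sketch: fit y ≈ X w + b by least squares and compute r2 (recall) and the
# leave-one-out q2 via the hat matrix. Data are synthetic stand-ins for the
# 123 molecules x 9 selected PaDEL descriptors of the abstract.
rng = np.random.default_rng(0)
n_mol, n_desc = 123, 9
X = rng.normal(size=(n_mol, n_desc))             # hypothetical descriptors
true_w = rng.normal(size=n_desc)
y = X @ true_w + 0.1 * rng.normal(size=n_mol)    # pEC50-like response

A = np.column_stack([X, np.ones(n_mol)])         # append intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef

ss_res = np.sum((y - pred) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot                         # data-fitting quality

# Leave-one-out residuals without refitting n times: divide ordinary
# residuals by (1 - leverage), where leverage = diag of the hat matrix.
H = A @ np.linalg.inv(A.T @ A) @ A.T
loo_res = (y - pred) / (1 - np.diag(H))
q2 = 1 - np.sum(loo_res ** 2) / ss_tot           # cross-validated quality
```

By construction q2 is never larger than r2, which is why a model can "recall" well yet fail cross-validation, the distinction the abstract's multi-level validation is probing.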
Computation of free energy profiles with parallel adaptive dynamics
We propose a formulation of adaptive computation of free energy differences,
in the ABF or nonequilibrium metadynamics spirit, using conditional
distributions of samples of configurations which evolve in time. This allows us
to present a truly unifying framework for these methods, and to prove convergence
results for certain classes of algorithms. From a numerical viewpoint, a
parallel implementation of these methods is very natural, the replicas
interacting through the reconstructed free energy. We show how to improve this
parallel implementation by resorting to some selection mechanism on the
replicas. This is illustrated by computations on a model system of
conformational changes.
Comment: 4 pages, 1 figure
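A single-replica, one-dimensional sketch of the adaptive-biasing idea (in the ABF spirit) is given below for a double-well potential. The conditional-distribution formulation and the interacting-replica selection mechanism of the paper are far more general; all numerical parameters here are illustrative.

```python
import math, random

# ABF-style sketch: accumulate the mean force per bin along x for the double
# well V(x) = (x^2 - 1)^2, and apply the running estimate as a biasing force
# so the dynamics eventually diffuses freely across the barrier.
random.seed(1)

def force(x):                        # -V'(x)
    return -4 * x * (x * x - 1)

nbins, lo, hi = 40, -2.0, 2.0
sum_f = [0.0] * nbins                # accumulated free energy gradient samples
count = [0] * nbins

def bin_of(x):
    return min(nbins - 1, max(0, int((x - lo) / (hi - lo) * nbins)))

x, dt, beta = -1.0, 1e-3, 3.0
for step in range(200_000):
    b = bin_of(x)
    f = force(x)
    sum_f[b] += -f                   # instantaneous estimate of dA/dx
    count[b] += 1
    bias = sum_f[b] / count[b]       # adaptive bias cancels the mean force
    # overdamped Langevin step with the biasing force added
    x += dt * (f + bias) + math.sqrt(2 * dt / beta) * random.gauss(0, 1)
    x = min(hi, max(lo, x))          # keep the walker in the binned range
```

The per-bin averages `sum_f[b] / count[b]` approximate the free energy gradient, so integrating them across bins recovers the free energy profile the title refers to.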
A Backward Particle Interpretation of Feynman-Kac Formulae
We design a particle interpretation of Feynman-Kac measures on path spaces
based on a backward Markovian representation combined with a traditional mean
field particle interpretation of the flow of their final time marginals. In
contrast to traditional genealogical tree based models, these new particle
algorithms can be used to compute normalized additive functionals "on-the-fly"
as well as their limiting occupation measures with a given precision degree
that does not depend on the final time horizon.
We provide uniform convergence results w.r.t. the time horizon parameter as
well as functional central limit theorems and exponential concentration
estimates. We also illustrate these results in the context of computational
physics and imaginary time Schrödinger type partial differential equations,
with a special interest in the numerical approximation of the invariant measure
associated to h-processes.
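The mean-field particle interpretation of a Feynman-Kac flow can be sketched as mutation under a Markov kernel followed by selection proportional to a potential. The backward Markovian representation that is the paper's contribution is not reproduced here, and the kernel and potential below are arbitrary choices for illustration.

```python
import math, random

# Mean-field particle sketch of a Feynman-Kac flow: N particles mutate under
# a Markov kernel, then are resampled with probabilities proportional to a
# potential G. The running product of mean potentials estimates the
# normalizing constant of the flow.
random.seed(0)
N, T = 2000, 20

def mutate(x):                       # illustrative random-walk kernel
    return x + random.gauss(0, 1)

def G(x):                            # illustrative potential, values in (0, 1]
    return math.exp(-0.5 * x * x)

xs = [random.gauss(0, 1) for _ in range(N)]
log_norm = 0.0                       # log of the estimated normalizing constant
for t in range(T):
    ws = [G(x) for x in xs]
    log_norm += math.log(sum(ws) / N)
    xs = random.choices(xs, weights=ws, k=N)   # multinomial selection
    xs = [mutate(x) for x in xs]               # mutation step
```

Genealogical estimators of additive functionals read statistics off the ancestral lines of this scheme; the paper's backward interpretation avoids the resulting degeneracy, giving errors that do not grow with the time horizon.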
Inertia in the North American Electricity Industry: Can the Kyoto Protocol Objectives Be Realistically Met?
If they are to be attained, the objectives set in the Kyoto Protocol will impose fundamental changes on the structure of North America's economy. This text highlights the extent of the Kyoto challenge by clearly describing the historical inertia in terms of total market shares for different production technologies of the North American electricity industry. It also compares two potential scenarios of the industry changes needed to attain the Kyoto objectives. The results obtained suggest that it will be virtually impossible to reach the Kyoto objectives within the electricity industry.
Keywords: Kyoto Protocol, Electricity Industry, Technological Change
Reweighting for Nonequilibrium Markov Processes Using Sequential Importance Sampling Methods
We present a generic reweighting method for nonequilibrium Markov processes.
With nonequilibrium Monte Carlo simulations at a single temperature, one
calculates the time evolution of physical quantities at different temperatures,
which greatly saves the computational time. Using the dynamical finite-size
scaling analysis for the nonequilibrium relaxation, one can study the dynamical
properties of phase transitions together with the equilibrium ones. We
demonstrate the procedure for the Ising model with the Metropolis algorithm,
but the present formalism is general and can be applied to a variety of systems
as well as with different Monte Carlo update schemes.
Comment: accepted for publication in Phys. Rev. E (Rapid Communications)
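The base dynamics being reweighted can be sketched as standard Metropolis updates for the 2D Ising model. The sequential-importance-sampling reweighting to other temperatures is not reproduced here, and the lattice size, temperature, and sweep count below are arbitrary.

```python
import math, random

# Metropolis dynamics for the 2D Ising model (J = 1, periodic boundaries) at
# inverse temperature beta: the single-temperature simulation that the
# trajectory-reweighting method of the abstract would post-process.
random.seed(2)
L, beta = 16, 0.3
spins = [[1] * L for _ in range(L)]          # start fully ordered

def sweep():
    for _ in range(L * L):
        i, j = random.randrange(L), random.randrange(L)
        nb = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
              + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
        dE = 2 * spins[i][j] * nb            # energy cost of flipping (i, j)
        if dE <= 0 or random.random() < math.exp(-beta * dE):
            spins[i][j] *= -1                # Metropolis accept

for _ in range(100):
    sweep()
m = abs(sum(sum(row) for row in spins)) / (L * L)   # magnetization per spin
```

Recording the accumulated energy changes along each trajectory is what would allow reweighting observables such as `m` to nearby temperatures from this single run.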
An integrated brain-behavior model for working memory.
Working memory (WM) is a central construct in cognitive neuroscience because it comprises mechanisms of active information maintenance and cognitive control that underpin most complex cognitive behavior. Individual variation in WM has been associated with multiple behavioral and health features, including demographic characteristics, cognitive and physical traits, and lifestyle choices. In this context, we used sparse canonical correlation analyses (sCCAs) to determine the covariation between brain imaging metrics of WM-network activation and connectivity and nonimaging measures relating to sensorimotor processing, affective and nonaffective cognition, mental health and personality, physical health, and lifestyle choices in 823 healthy participants from the Human Connectome Project. We conducted sCCAs at two levels: a global level, testing the overall association between the entire imaging and behavioral-health data sets; and a modular level, testing associations between subsets of the two data sets. The behavioral-health and neuroimaging data sets showed significant interdependency. Variables with positive correlation to the neuroimaging variate represented higher physical endurance and fluid intelligence as well as better function in multiple higher-order cognitive domains. Negatively correlated variables represented indicators of suboptimal cardiovascular and metabolic control and lifestyle choices such as alcohol and nicotine use. These results underscore the importance of accounting for behavioral-health factors in neuroimaging studies of WM and provide a neuroscience-informed framework for personalized and public health interventions to promote and maintain the integrity of the WM network.
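Plain (non-sparse) canonical correlation between two data blocks can be sketched via whitening and an SVD of the cross-product. The study itself uses sparse CCA with permutation-based significance testing on real imaging and behavioral-health data; the blocks below are synthetic with a planted shared signal.

```python
import numpy as np

# CCA sketch: the canonical correlations between centered blocks X and Y are
# the singular values of Qx.T @ Qy, where Qx and Qy are orthonormal bases for
# the column spaces of the centered blocks.
rng = np.random.default_rng(3)
n = 200
shared = rng.normal(size=(n, 1))                       # latent shared signal
X = shared @ rng.normal(size=(1, 5)) + 0.5 * rng.normal(size=(n, 5))
Y = shared @ rng.normal(size=(1, 4)) + 0.5 * rng.normal(size=(n, 4))

Xc, Yc = X - X.mean(0), Y - Y.mean(0)                  # center each block

def orthonormal_basis(A):
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U

rho = np.linalg.svd(orthonormal_basis(Xc).T @ orthonormal_basis(Yc),
                    compute_uv=False)[0]               # first canonical corr.
```

Sparse CCA replaces these unconstrained weight vectors with L1-penalized ones, which is what lets the study attribute the imaging-behavior covariation to interpretable subsets of variables.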
Advanced Mid-Water Tools for 4D Marine Data Fusion and Visualization
Mapping and charting of the seafloor underwent a revolution approximately 20 years ago with the introduction of multibeam sonars -- sonars that provided complete, high-resolution coverage of the seafloor rather than sparse measurements. The initial focus of these sonar systems was the charting of depths in support of safety of navigation and offshore exploration; more recently, innovations in processing software have led to approaches to characterize seafloor type and to map seafloor habitat in support of fisheries research. In recent years, a new generation of multibeam sonars has been developed that, for the first time, have the ability to map the water column along with the seafloor. This ability will potentially allow multibeam sonars to address a number of critical ocean problems, including the direct mapping of fish and marine mammals, the location of mid-water targets and, if water column properties are appropriate, a wide range of physical oceanographic processes. This potential relies on suitable software to make use of all of the newly available data. Currently, the users of these sonars have a limited view of the mid-water data in real time and limited capacity to store it, replay it, or run further analysis. These data also need to be integrated with other sensor assets such as bathymetry, backscatter, sub-bottom, seafloor characterizations and other assets so that a “complete” picture of the marine environment under analysis can be realized. Software tools developed for this type of data integration should support a wide range of sonars with a unified format for the wide variety of mid-water sonar types. This paper describes the evolution and result of an effort to create a software tool that meets these needs, and details case studies using the new tools in the areas of fisheries research, static target search, wreck surveys and physical oceanographic processes.
Uniform stability of a particle approximation of the optimal filter derivative
Particle methods, also known as Sequential Monte Carlo methods, are a principled set of algorithms to approximate numerically the optimal filter in non-linear non-Gaussian state-space models. However, when performing maximum likelihood parameter inference in state-space models, it is also necessary to approximate the derivative of the optimal filter with respect to the parameter of the model. Poyiadjis et al. [2005, 2011] present an original particle method to approximate this derivative, and it was shown in numerical examples to be numerically stable in the sense that it did not deteriorate over time. In this paper we theoretically substantiate this claim. Lp bounds and a central limit theorem for this particle approximation are presented. Under mixing conditions, these Lp bounds and the asymptotic variance are uniformly bounded with respect to the time index.
This is the final version. It first appeared at http://epubs.siam.org/doi/abs/10.1137/140993703
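A bootstrap particle filter for a simple linear-Gaussian state-space model can be sketched as below; the paper concerns the harder problem of approximating the derivative of the filter with respect to model parameters, which this sketch does not attempt. The model and all parameters are invented for illustration.

```python
import math, random

# Bootstrap particle filter for x_t = phi * x_{t-1} + v_t, y_t = x_t + w_t,
# with v_t ~ N(0, sx^2) and w_t ~ N(0, sy^2): propagate, weight by the
# observation density, accumulate the log-likelihood, resample.
random.seed(4)
phi, sx, sy, N, T = 0.9, 1.0, 1.0, 1000, 50

x, ys = 0.0, []                       # simulate synthetic observations
for _ in range(T):
    x = phi * x + random.gauss(0, sx)
    ys.append(x + random.gauss(0, sy))

def obs_density(y, x):                # N(y; x, sy^2)
    return math.exp(-0.5 * ((y - x) / sy) ** 2) / math.sqrt(2 * math.pi * sy * sy)

parts = [random.gauss(0, 1) for _ in range(N)]
ll = 0.0                              # particle estimate of the log-likelihood
for y in ys:
    parts = [phi * p + random.gauss(0, sx) for p in parts]   # propagate
    ws = [obs_density(y, p) for p in parts]                  # weight
    ll += math.log(sum(ws) / N)                              # likelihood incr.
    parts = random.choices(parts, weights=ws, k=N)           # resample
```

Differentiating `ll` with respect to `phi` is exactly where the filter-derivative approximation studied in the paper enters maximum likelihood inference.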